On Generalized Csiszár–Kullback Inequalities

Authors

  • Anton Arnold
  • Peter Markowich
  • Giuseppe Toscani
  • Andreas Unterreiter
Abstract

The classical Csiszár–Kullback inequality bounds the L¹-distance of two probability densities in terms of their relative (convex) entropies. Here we generalize such inequalities to not necessarily normalized and possibly non-positive L¹ functions. Also, our generalized Csiszár–Kullback inequalities are in many important cases sharper than the classical ones (in terms of the functional dependence of the L¹ bound on the relative entropy). Moreover, our construction of these bounds is rather elementary.
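The classical inequality in its Pinsker form states that ‖p − q‖₁ ≤ √(2 H(p|q)), where H(p|q) is the relative entropy. This bound can be checked numerically; the following is a minimal sketch for discrete probability vectors, with example distributions chosen purely for illustration:

```python
import math

def kl_divergence(p, q):
    """Relative entropy H(p|q) = sum_i p_i * log(p_i / q_i) for discrete densities."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def l1_distance(p, q):
    """L1 (total variation, up to a factor of 2) distance between discrete densities."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

# Example probability vectors (assumed, for illustration only)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

lhs = l1_distance(p, q)                   # L1 distance
rhs = math.sqrt(2 * kl_divergence(p, q))  # classical Csiszar-Kullback-Pinsker bound
assert lhs <= rhs
```

Here the L¹ distance is 0.2 while the entropy bound evaluates to roughly 0.22, so the classical inequality holds with little slack in this example; the generalizations discussed in the paper concern sharper functional dependence of such bounds on the relative entropy.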


Related references

Defocusing Nonlinear Schrödinger Equation: Confinement, Stability and Asymptotic Stability. Keywords: nonlinear Schrödinger equation – dispersion – pseudo-conformal law – time-dependent rescalings – stability – large time asymptotics – minimizers – relative entropy – Csiszár-Kullback inequality

This paper is devoted to the asymptotic properties of the nonlinear Schrödinger equation in the defocusing case. We recover dispersion rates by means of time-dependent rescalings in the power-law case and prove a new result in the case of the logarithmic nonlinear Schrödinger equation. The rescaled equation is then used to obtain an asymptotic result of nonlinear stability, which is the ...


Generalization of Cramér-Rao and Bhattacharyya inequalities for the weighted covariance matrix

The paper considers a family of probability distributions depending on a parameter. The goal is to derive the generalized versions of Cramér-Rao and Bhattacharyya inequalities for the weighted covariance matrix and of the Kullback inequality for the weighted Kullback distance, which are important objects themselves [9, 23, 28]. The asymptotic forms of these inequalities for a particular family ...


Generalized Relative Information and Information Inequalities

In this paper, we obtain bounds on Csiszár's f-divergence in terms of relative information of type s, using Dragomir's approach [9]. In particular, the results lead to bounds in terms of the χ-divergence, Kullback-Leibler relative information, and Hellinger discrimination.


Weighted Csiszár-Kullback-Pinsker Inequalities and Applications to Transportation Inequalities

Abstract. We strengthen the usual Csiszár-Kullback-Pinsker inequality by allowing weights in the total variation norm; admissible weights depend on the decay of the reference probability measure. We use this result to derive transportation inequalities involving Wasserstein distances for various exponents: in particular, we recover the equivalence between a T1 inequality and the existence of a ...


Trace inequalities in nonextensive statistical mechanics

In this short paper, we establish a variational expression of the Tsallis relative entropy. In addition, we derive a generalized thermodynamic inequality and a generalized Peierls-Bogoliubov inequality. Finally we give a generalized Golden-Thompson inequality.



Publication date: 2000